Kappa Coefficient (κ)

Chance-corrected agreement


AKA
  • Cohen’s Kappa
  • Kappa Coefficient

The Kappa Coefficient (κ) is a measure of inter-rater agreement for categorical items between two raters.

Kappa (κ) is similar to percent agreement, but it is corrected for the agreement expected by chance.

Cohen’s kappa (κ) can only be used with two raters; with more than two raters, Krippendorff’s alpha should be used instead.

Calculation

Kappa (κ) = (proportion of observed agreement − proportion of expected chance agreement) / (1 − proportion of expected chance agreement)

Written symbolically, κ = (Pₒ − Pₑ) / (1 − Pₑ), where Pₒ is the observed agreement and Pₑ is the agreement expected by chance.
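
The sketch below shows one way this formula can be computed in Python for two raters. The ratings and the function name cohens_kappa are hypothetical examples for illustration only; they are not data from this article.

```python
# A minimal sketch of the kappa calculation for two raters; the labels below
# are hypothetical (1 = impairment present, 0 = absent).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa from two equal-length lists of category labels."""
    n = len(rater_a)
    # Proportion of observed agreement: how often the two raters match.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Proportion of agreement expected by chance, from each rater's
    # marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))  # ≈ 0.58
```

Here the two raters agree on 8 of 10 ratings (Pₒ = 0.80), while their marginal frequencies give Pₑ = 0.52, so κ = (0.80 − 0.52) / (1 − 0.52) ≈ 0.58.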

Scoring

Values range from 0.00 (agreement no better than chance) to 1.00 (perfect agreement); negative values indicate agreement worse than chance.

“Rule of thumb” for quick interpretation of reliability:
Score Reliability
0.00 – 0.20 Poor
0.21 – 0.40 Fair
0.41 – 0.60 Moderate
0.61 – 0.80 Good
0.81 – 1.00 Excellent
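
As a brief illustration of how the table above might be applied, the snippet below computes kappa with scikit-learn’s cohen_kappa_score and maps the result onto the rule-of-thumb labels. The ratings and the interpret_kappa helper are hypothetical, and the cut-offs simply mirror the table in this article.

```python
# Hedged example: compute kappa with scikit-learn and label it using the
# rule-of-thumb cut-offs from the table above.
from sklearn.metrics import cohen_kappa_score

def interpret_kappa(kappa):
    """Map a kappa value onto the rule-of-thumb reliability labels."""
    if kappa <= 0.20:
        return "Poor"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Good"
    return "Excellent"

# Hypothetical ratings from two raters (same data as the sketch above).
rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"kappa = {kappa:.2f} ({interpret_kappa(kappa)})")  # kappa = 0.58 (Moderate)
```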

Citation

For attribution, please cite this work as: